How Many Pairwise Preferences Do We Need to Rank A Graph Consistently?
We consider the problem of optimally recovering the true ranking of $n$ items from
a randomly chosen subset of their pairwise preferences. It is well known that,
without any further assumption, a sample size of $\Omega(n^2)$ is required for
the purpose. We analyze the problem with the additional structure of a relational
graph $G([n], E)$ over the $n$ items, together with an assumption of
\emph{locality}: neighboring items are similar in their rankings. Noting the
preferential nature of the data, we choose to embed not the graph itself but its
\emph{strong product}, so as to capture the pairwise node relationships. Furthermore,
unlike the existing literature, which uses the Laplacian embedding for graph-based
learning problems, we use a richer class of graph
embeddings---\emph{orthonormal representations}---that includes the (normalized)
Laplacian as a special case. Our proposed algorithm, {\it Pref-Rank},
predicts the underlying ranking using an SVM based approach over the chosen
embedding of the product graph, and is the first to provide \emph{statistical
consistency} on two ranking losses: \emph{Kendall's tau} and \emph{Spearman's
footrule}, with a required sample complexity of $\mathcal{O}(n^2 \chi(\bar{G}))^{2/3}$ pairs, $\chi(\bar{G})$ being the \emph{chromatic
number} of the complement graph $\bar{G}$. Clearly, our sample complexity is
smaller for dense graphs, with $\chi(\bar{G})$ characterizing the degree of node
connectivity, which is also intuitive given the locality assumption, e.g.
$\mathcal{O}(n^{4/3})$ for a union of $k$-cliques, or $\mathcal{O}(n^{5/3})$ for random
and power-law graphs---a quantity much smaller than the fundamental limit
of $\Omega(n^2)$ for large $n$. This, for the first time, relates ranking
complexity to structural properties of the graph. We also report experimental
evaluations on different synthetic and real datasets, where our algorithm is
shown to outperform state-of-the-art methods.

Comment: In the Thirty-Third AAAI Conference on Artificial Intelligence, 2019
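The embedding step the abstract describes can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: given an item graph, form the strong product of the graph with itself (so each product node is an item pair), then take a normalized-Laplacian spectral embedding, which is one member of the orthonormal-representation family the paper generalizes over.

```python
# Illustrative sketch (not the authors' code): embed the strong product of an
# item graph with itself using a normalized-Laplacian spectral embedding.
import numpy as np

def strong_product(A):
    """Adjacency matrix of the strong product G x G from adjacency A."""
    n = A.shape[0]
    I = np.eye(n)
    # (i, j) ~ (i', j') iff i is equal-or-adjacent to i' AND j is
    # equal-or-adjacent to j', excluding self-loops:
    # (A + I) kron (A + I) minus the identity.
    return np.clip(np.kron(A + I, A + I) - np.eye(n * n), 0, 1)

def laplacian_embedding(S, dim):
    """Smoothest `dim` spectral coordinates of the product graph."""
    d = S.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(S.shape[0]) - d_inv_sqrt @ S @ d_inv_sqrt  # normalized Laplacian
    vals, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    return vecs[:, :dim]            # low-frequency (smooth) directions

# Toy item graph: two disjoint 3-cliques, a dense, locality-friendly case.
A = np.zeros((6, 6))
for a, b in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    A[a, b] = A[b, a] = 1

S = strong_product(A)           # 36 product nodes, one per ordered item pair
X = laplacian_embedding(S, 4)   # 4-dimensional embedding per pair
print(S.shape, X.shape)         # (36, 36) (36, 4)
```

In the paper's pipeline these pair embeddings would then feed an SVM-style preference predictor; the sketch stops at the embedding, which is the graph-theoretic part the abstract emphasizes.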
Integrated mmWave Access and Backhaul in 5G: Bandwidth Partitioning and Downlink Analysis
With increasing network densification, it has become exceedingly
difficult to provide traditional fiber backhaul to each cell site; this
is especially true for small cell base stations (SBSs). The increasing maturity
of millimeter wave (mmWave) communication has opened up the possibility of
providing high-speed wireless backhaul to such cell sites. Since mmWave is also
suitable for access links, the third generation partnership project (3GPP) is
envisioning an integrated access and backhaul (IAB) architecture for the fifth
generation (5G) cellular networks in which the same infrastructure and spectral
resources will be used for both access and backhaul. In this paper, we develop
an analytical framework for an IAB-enabled cellular network, using which we provide
an accurate characterization of its downlink rate coverage probability. Using
this, we study the performance of two backhaul bandwidth (BW) partition
strategies: (i) equal partition, in which all SBSs obtain an equal share of the
backhaul BW; and (ii) load-based partition, in which the backhaul BW share of an
SBS is proportional to its load. Our analysis shows that, depending on the
choice of the partition strategy, there exists an optimal split of access and
backhaul BW for which the rate coverage is maximized. Further, there exists a
critical volume of cell load (total number of users) beyond which the gains
provided by the IAB-enabled network disappear and its performance converges to
that of a traditional macro-only network with no SBSs.
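The two partition strategies can be made concrete with a small sketch. The numbers below are hypothetical illustrations, not values from the paper's system model:

```python
# Hypothetical illustration of the two backhaul BW partition strategies.
def equal_partition(w_bh, loads):
    """Every SBS gets the same share of the total backhaul BW w_bh."""
    n = len(loads)
    return [w_bh / n] * n

def load_based_partition(w_bh, loads):
    """Each SBS's share of w_bh is proportional to its load (user count)."""
    total = sum(loads)
    return [w_bh * load / total for load in loads]

loads = [10, 30, 60]  # users attached to each of three SBSs (made-up numbers)
print(equal_partition(600.0, loads))       # [200.0, 200.0, 200.0]
print(load_based_partition(600.0, loads))  # [60.0, 180.0, 360.0]
```

The load-based rule shifts backhaul capacity toward congested SBSs, which is why the optimal access/backhaul split the paper identifies differs between the two strategies.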